Journal article
Incorporating word attention with convolutional neural networks for abstractive summarization
C Yuan, Z Bao, M Sanderson, Y Tang
World Wide Web | SPRINGER | Published : 2020
Abstract
Neural sequence-to-sequence (seq2seq) models have been widely used in abstractive summarization tasks. One challenge of this task is that redundant content in the input document often confuses the models and leads to poor performance. An effective way to address this problem is to select salient information from the input document. In this paper, we propose an approach that incorporates word attention with multilayer convolutional neural networks (CNNs) to extend a standard seq2seq model for abstractive summarization. First, by concentrating on a subset of source words while encoding an input sentence, word attention is able to extract informative keywords from the input, which gives us the…
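As a rough illustration of the word-attention idea the abstract describes, the sketch below scores each source word against a query vector, normalizes the scores with a softmax, and re-weights the word embeddings before a downstream (e.g. CNN) encoder would consume them. All names, shapes, and the choice of a dot-product scorer are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(embeddings, query):
    """Hypothetical word attention.

    embeddings: (seq_len, dim) source-word embeddings
    query:      (dim,) context vector (illustrative; e.g. a document summary)
    Returns embeddings re-weighted so informative words are emphasized.
    """
    scores = embeddings @ query            # one relevance score per word
    weights = softmax(scores)              # attention distribution over words
    return embeddings * weights[:, None]   # scale each word's embedding

# Toy usage with random vectors.
rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 8))   # 5 source words, 8-dim embeddings
q = rng.normal(size=8)
weighted = word_attention(emb, q)
print(weighted.shape)           # (5, 8)
```

The re-weighted embeddings would then feed the multilayer CNN encoder; in the paper's full model the attention is presumably learned jointly with the seq2seq objective rather than computed from a fixed query as here.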
Grants
Awarded by Appalachian Regional Commission